Meteorological application of Apollo photography Final report
Development of meteorological information and parameters based on cloud photographs taken during the Apollo 9 flight.
Gaussian Approximation Potentials: the accuracy of quantum mechanics, without the electrons
We introduce a class of interatomic potential models that can be
automatically generated from data consisting of the energies and forces
experienced by atoms, derived from quantum mechanical calculations. The
resulting model does not have a fixed functional form and hence is capable of
modeling complex potential energy landscapes. It is systematically improvable
with more data. We apply the method to bulk carbon, silicon and germanium and
test it by calculating properties of the crystals at high temperatures. Using
the interatomic potential to generate the long molecular dynamics trajectories
required for such calculations saves orders of magnitude in computational cost.
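The data-driven idea in this abstract can be illustrated with plain Gaussian process regression on a one-dimensional toy "descriptor". This is a hypothetical sketch only, not the actual GAP framework, which uses many-body descriptors and fits forces as well as energies.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=0.5):
    """Squared-exponential kernel between 1-D descriptor values."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

# Hypothetical training data: one scalar descriptor per atomic
# environment and a corresponding reference "quantum-mechanical" energy.
rng = np.random.default_rng(0)
x_train = np.linspace(0.0, 2.0, 20)
y_train = np.sin(2.0 * x_train) + 0.01 * rng.standard_normal(20)

noise = 1e-4  # assumed noise on the reference energies (regularization)
K = rbf_kernel(x_train, x_train) + noise * np.eye(x_train.size)
alpha = np.linalg.solve(K, y_train)

def predict(x_new):
    """GP posterior mean energy at new descriptor values."""
    return rbf_kernel(np.atleast_1d(x_new), x_train) @ alpha

# The model has no fixed functional form and improves systematically
# as more (descriptor, energy) pairs are added to the training set.
print(predict(1.0))
```

Evaluating `predict` is far cheaper than redoing the reference quantum-mechanical calculation, which is what makes the long molecular dynamics trajectories mentioned above affordable.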
Initial results from the Caltech/DSRI balloon-borne isotope experiment
The Caltech/DSRI balloon-borne High Energy Isotope Spectrometer Telescope (HEIST) was flown successfully from Palestine, Texas, on 14 May 1984. The experiment was designed to measure cosmic-ray isotopic abundances from neon through iron, with incident particle energies from approximately 1.5 to 2.2 GeV/nucleon, depending on the element. During approximately 38 hours at float altitude, 100,000 events were recorded with Z ≥ 6 and incident energies ≥ approximately 1.5 GeV/nucleon. We present results from the ongoing data analysis associated with both the preflight Bevalac calibration and the flight data.
Examination of offsite radiological emergency protective measures for nuclear reactor accidents involving core melt
"Date published: June 1978. Reissued: October 1979." MITNE series handwritten on title page. "SAND78-0454." Originally issued as a Ph.D. thesis by the first author, supervised by the second and third authors, MIT, Dept. of Nuclear Engineering, 1978. Includes bibliographical references.
Evacuation, sheltering followed by population relocation, and iodine prophylaxis are evaluated as offsite public protective measures in response to nuclear reactor accidents involving core melt. Evaluations were conducted using a modified version of the Reactor Safety Study consequence model. Models representing each measure were developed and are discussed. Potential PWR core-melt radioactive material releases are separated into two categories, "Melt-through" and "Atmospheric," based upon the mode of containment failure. Protective measures are examined and compared for each category in terms of projected doses to the whole body and thyroid. Measures for "Atmospheric" accidents are also examined in terms of their influence on the occurrence of public health effects.
For "Melt-through" accidents, few, if any, early public health effects are likely, and doses in excess of Protective Action Guides (PAGs) are "confined" to areas within 10 miles of the reactor. Evacuation appears to provide the largest reduction in whole-body dose for this category. However, sheltering, particularly when basements are readily available, may be an acceptable alternative. Both evacuation and iodine prophylaxis can substantially reduce the dose to the thyroid.
For "Atmospheric" accidents, PAGs are likely to be exceeded at very large distances, and significant numbers of early public health effects are possible. However, most early fatalities occur within 10 miles of the reactor. Within 5 miles, evacuation appears to be more effective than sheltering in reducing the number of early health effects. Beyond 5 miles, this distinction is less, or not, apparent. Within 10 miles, early health effects are strongly influenced by the speed and efficiency with which protective measures are implemented; outside of 10 miles, they are not. The projected total number of thyroid nodules is not substantially reduced unless iodine prophylaxis is administered over very large areas (distances). The qualitative effects of weather conditions on the above conclusions are also briefly discussed.
Prepared for the Office of Nuclear Regulatory Research, Probabilistic Staff, U.S. Nuclear Regulatory Commission, under Interagency Agreement DOE-40-550-75, NRC FIN no. A103.
Efficient Bayesian hierarchical functional data analysis with basis function approximations using Gaussian-Wishart processes
Functional data are defined as realizations of random functions (mostly
smooth functions) varying over a continuum, which are usually collected with
measurement errors on discretized grids. In order to accurately smooth noisy
functional observations and deal with the issue of high-dimensional observation
grids, we propose a novel Bayesian method based on the Bayesian hierarchical
model with a Gaussian-Wishart process prior and basis function representations.
We first derive an induced model for the basis-function coefficients of the
functional data, and then use this model to conduct posterior inference through
Markov chain Monte Carlo. Compared to standard Bayesian inference, which suffers
from a serious computational burden and instability when analyzing
high-dimensional functional data, our method greatly improves computational
scalability and stability, while inheriting the advantage of simultaneously
smoothing raw observations and estimating the mean-covariance functions in a
nonparametric way. In addition, our method can naturally handle functional data
observed on random or uncommon grids. Simulation and real-data studies demonstrate
that our method produces results similar to those of standard Bayesian inference
with low-dimensional common grids, while efficiently smoothing and estimating
functional data with random and high-dimensional observation grids where the
standard Bayesian inference fails. In conclusion, our method can efficiently
smooth and estimate high-dimensional functional data, providing one way to
resolve the curse of dimensionality for Bayesian functional data analysis with
Gaussian-Wishart processes.
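The basis-function idea in this abstract can be illustrated with a plain least-squares projection of one noisy curve onto a low-dimensional basis. The Fourier basis, grid size, and noise level below are hypothetical, and the actual method instead places a Gaussian-Wishart process prior on the induced coefficient model and samples the posterior by MCMC; this is only a sketch of why the basis representation tames a high-dimensional grid.

```python
import numpy as np

# Hypothetical noisy functional observation on a dense grid.
rng = np.random.default_rng(1)
grid = np.linspace(0.0, 1.0, 500)   # high-dimensional observation grid
truth = np.sin(2.0 * np.pi * grid)
y = truth + 0.2 * rng.standard_normal(grid.size)

# Low-dimensional Fourier basis: the whole curve is summarized by a
# handful of coefficients instead of 500 grid values.
K = 5
basis = np.column_stack(
    [np.ones_like(grid)]
    + [np.sin(2.0 * np.pi * k * grid) for k in range(1, K + 1)]
    + [np.cos(2.0 * np.pi * k * grid) for k in range(1, K + 1)]
)

# Least-squares coefficients; a Bayesian treatment would put a prior on
# these coefficients and run posterior inference on them instead.
coef, *_ = np.linalg.lstsq(basis, y, rcond=None)
smooth = basis @ coef

rmse = np.sqrt(np.mean((smooth - truth) ** 2))
print(rmse)  # much smaller than the 0.2 observation noise
```

Because inference happens in the coefficient space (11 numbers here) rather than on the 500-point grid, the computational cost no longer scales with the grid dimension, which is the scalability gain the abstract describes.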
A method for risk analysis of nuclear reactor accidents
Originally presented as the first author's thesis, (Ph. D.)--in the M.I.T. Dept. of Nuclear Engineering, 1976Includes bibliographical references (pages 207-208)Prepared for the U.S. Nuclear Regulatory Commission, Office of Nuclear Regulatory Research no. AT(49-24)-026
X-Ray Emission from the Warm Hot Intergalactic Medium
The number of detected baryons in the Universe at z<0.5 is much smaller than
predicted by standard big bang nucleosynthesis and by the detailed observation
of the Lyman-alpha forest at redshift z=2. Hydrodynamical simulations indicate
that a large fraction of the baryons today is expected to be in a "warm-hot"
(10^5-10^7 K) filamentary gas, distributed in the intergalactic medium. This
gas, if it exists, should be observable only in the soft X-ray and UV bands.
Using the predictions of a particular hydrodynamic model, we simulated the
expected X-ray flux as a function of energy in the 0.1-2 keV band due to the
Warm-Hot Intergalactic Medium (WHIM), and compared it with the flux from local
and high-redshift diffuse components. Our results show that as much as 20% of
the total diffuse X-ray background (DXB) in the energy range 0.37-0.925 keV
could be due to X-ray flux from the WHIM, 70% of which comes from filaments at
redshift z between 0.1 and 0.6. Simulations done using a FOV of 3', comparable
with that of Suzaku and Constellation-X, show that in more than 20% of the
observations we expect the WHIM flux to contribute to more than 20% of the DXB.
These simulations also show that in about 10% of all the observations a single
bright filament in the FOV accounts, alone, for more than 20% of the DXB flux.
Redshifted oxygen lines should be clearly visible in these observations.
Time-varying Learning and Content Analytics via Sparse Factor Analysis
We propose SPARFA-Trace, a new machine learning-based framework for
time-varying learning and content analytics for education applications. We
develop a novel message passing-based, blind, approximate Kalman filter for
sparse factor analysis (SPARFA), that jointly (i) traces learner concept
knowledge over time, (ii) analyzes learner concept knowledge state transitions
(induced by interacting with learning resources, such as textbook sections,
lecture videos, etc., or the forgetting effect), and (iii) estimates the content
organization and intrinsic difficulty of the assessment questions. These
quantities are estimated solely from binary-valued (correct/incorrect) graded
learner response data and a summary of the specific actions each learner
performs (e.g., answering a question or studying a learning resource) at each
time instance. Experimental results on two online course datasets demonstrate
that SPARFA-Trace is capable of tracing each learner's concept knowledge
evolution over time, as well as analyzing the quality and content organization
of learning resources, the question-concept associations, and the question
intrinsic difficulties. Moreover, we show that SPARFA-Trace achieves comparable
or better performance in predicting unobserved learner responses than existing
collaborative filtering and knowledge tracing approaches for personalized
education.
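SPARFA-Trace itself uses a blind, approximate message-passing Kalman filter on binary graded responses; the sketch below strips that down to a hypothetical scalar linear-Gaussian Kalman filter, purely to illustrate the predict/update loop that traces a latent knowledge state over time. The noise parameters and score sequence are invented for illustration.

```python
import numpy as np

def kalman_trace(responses, q=0.05, r=0.5):
    """Trace a scalar latent 'concept knowledge' state over time.

    responses: graded scores, treated here as noisy Gaussian observations
    (SPARFA-Trace instead handles binary correct/incorrect data with an
    approximate message-passing filter).
    q: process noise -- how much knowledge may drift between time steps.
    r: observation noise of a single graded response.
    """
    mean, var = 0.0, 1.0        # prior on the initial knowledge state
    trace = []
    for y in responses:
        var += q                # predict: knowledge drifts a little
        gain = var / (var + r)  # update: weigh the new response
        mean += gain * (y - mean)
        var *= 1.0 - gain
        trace.append(mean)
    return np.array(trace)

# A learner whose scores improve over time: the filtered estimate
# follows the upward trend while smoothing out response noise.
scores = np.array([0.1, 0.2, 0.4, 0.5, 0.7, 0.8, 0.9])
estimates = kalman_trace(scores)
print(estimates)
```

In the full framework the state is a concept-knowledge vector, the transitions encode the effect of studying each learning resource, and the observation model ties the state to question-concept associations and intrinsic difficulties.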
FRANTIC 5 (a version of FRANTIC II): a computer code for evaluating system aging effects
"November 1986." Includes bibliographical references.
The FRANTIC 5 code is a modification of the FRANTIC II code for time-dependent unavailability analysis. FRANTIC 5 is specially adapted for modeling the effects of aging on system and component performance. The code uses the linear aging model, i.e., it is based on the assumption that component failure rates increase linearly in time. The constant failure rate and the aging acceleration rate for a component can be changed during the plant life, which allows the creation of different time scales for components as a function of replacement or any significant maintenance or repair action on the component. FRANTIC 5 preserves most of the unique features of FRANTIC II, for example the modeling of periodic testing. The output from FRANTIC 5 consists of the system mean unavailabilities, tables of the system unavailabilities at designated time points, and the system mean unavailabilities between consecutive tests. The code is applied to evaluate aging effects in the Auxiliary Feedwater System of Arkansas Nuclear Unit 1. The usefulness of the method will depend upon the availability of the component aging data needed to develop the model parameters.
Prepared for EG&G Idaho, Inc., special research subcontract no. C86-10094.
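The linear aging model can be sketched numerically: take a failure rate lambda(t) = lambda0 + a*t and average the standby unavailability over the plant life, assuming perfect, instantaneous periodic tests. The parameter values below are hypothetical and this is a toy calculation, not the FRANTIC code itself.

```python
import math

def mean_unavailability(lam0, a, test_interval, life, dt=1.0):
    """Mean standby unavailability under a linear aging failure rate
    lambda(t) = lam0 + a * t, with perfect, instantaneous periodic tests.

    A toy numerical sketch of a time-dependent unavailability
    calculation, not the FRANTIC model in full detail.
    """
    t, last_test = 0.0, 0.0
    acc, n = 0.0, 0
    while t < life:
        if t - last_test >= test_interval:
            last_test = t  # test renews the component (found good or repaired)
        # cumulative failure probability since the last test:
        # integral of lambda(s) ds from last_test to t
        hazard = lam0 * (t - last_test) + 0.5 * a * (t**2 - last_test**2)
        acc += 1.0 - math.exp(-hazard)
        n += 1
        t += dt
    return acc / n

# Hypothetical component: constant rate 1e-4 per hour, aging acceleration
# 1e-9 per hour^2, monthly (720 h) tests, 10-year (87,600 h) plant life.
u_aged = mean_unavailability(1e-4, 1e-9, 720.0, 87600.0)
u_const = mean_unavailability(1e-4, 0.0, 720.0, 87600.0)
print(u_const, u_aged)  # aging raises the mean unavailability
```

Resetting `last_test` at a maintenance or replacement time is how a calculation like this creates the per-component time scales the abstract mentions; here the reset happens only at the periodic tests.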
Time-dependent unavailability analysis of standby safety systems
"Prepared for Brookhaven National Laboratory." Includes bibliographical references (p. 280-284). Contract no. BNL-54668.